翻訳と辞書
Words near each other
・ Real Things (2 Unlimited album)
・ Real Things (Joe Nichols album)
・ Real Monasterio de Santa Inés del Valle (Écija)
・ Real Mulia F.C.
・ Real Murcia
・ Real Murcia Imperial
・ Real Music from Chicago
・ Real Music Series
・ Real Muthaphuckkin G's
・ Real net output ratio
・ Real neutral particle
・ Real News
・ Real Nigga Roll Call
・ Real Nighttime
・ Real Noroeste Capixaba Futebol Clube
Real number
・ Real Ones
・ Real Onigokko
・ Real Onigokko (song)
・ Real Options Group
・ Real options valuation
・ Real Oviedo
・ Real Oviedo Vetusta
・ Real Palace
・ Real party in interest
・ Real People
・ Real People (album)
・ Real People (disambiguation)
・ Real People (song)
・ Real People / Wild East



Real number : Wikipedia English edition
Real number

In mathematics, a real number is a value that represents a quantity along a continuous line. The adjective ''real'' in this context was introduced in the 17th century by Descartes, who distinguished between real and imaginary roots of polynomials.
The real numbers include all the rational numbers, such as the integer −5 and the fraction 4/3, all the irrational numbers, such as √2 (1.41421356…, the square root of two, an irrational algebraic number), and all transcendental numbers, such as π (3.14159265…, a transcendental number). Real numbers can be thought of as points on an infinitely long line called the number line or real line, where the points corresponding to integers are equally spaced. Any real number can be determined by a possibly infinite decimal representation, such as that of 8.632, where each consecutive digit is measured in units one tenth the size of the previous one. The real line can be thought of as a part of the complex plane, and complex numbers include real numbers.
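The place-value description above can be made explicit for the example 8.632; the following identity is a worked illustration added here, not part of the original article:

```latex
8.632 \;=\; 8\cdot 10^{0} + 6\cdot 10^{-1} + 3\cdot 10^{-2} + 2\cdot 10^{-3},
\qquad
x \;=\; \sum_{k=0}^{\infty} d_k \cdot 10^{-k}
```

where, in the general form on the right, d_0 is a non-negative integer and each later d_k is a digit from 0 to 9; an infinite tail of nonzero digits yields an irrational number.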
These descriptions of the real numbers are not sufficiently rigorous by the modern standards of pure mathematics. The discovery of a suitably rigorous definition of the real numbers – indeed, the realization that a better definition was needed – was one of the most important developments of 19th-century mathematics. The currently standard axiomatic definition is that the real numbers form the unique Archimedean complete totally ordered field, up to isomorphism,〔More precisely, given two complete totally ordered fields, there is a ''unique'' isomorphism between them. This implies that the identity is the unique field automorphism of the reals that is compatible with the ordering.〕 whereas popular constructive definitions of the real numbers include defining them as equivalence classes of Cauchy sequences of rational numbers, as Dedekind cuts, or as certain infinite "decimal representations", together with precise interpretations for the arithmetic operations and the order relation. These definitions are equivalent in the realm of classical mathematics.
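The Cauchy-sequence construction mentioned above can be sketched concretely: Heron's iteration produces rational numbers that form a Cauchy sequence whose limit, √2, is irrational, so the sequence itself can stand as a representative of that real number. This is an illustrative sketch; the function name and starting value are choices made here, not from the article:

```python
from fractions import Fraction

def sqrt2_cauchy(n):
    """Return the first n terms of a Cauchy sequence of rationals
    converging to the square root of 2 (Heron's/Newton's iteration)."""
    x = Fraction(3, 2)                     # any positive rational starting guess
    terms = []
    for _ in range(n):
        terms.append(x)
        x = (x + Fraction(2, 1) / x) / 2   # x_{k+1} = (x_k + 2/x_k) / 2
    return terms

terms = sqrt2_cauchy(5)
# Successive terms agree to rapidly increasing precision: the sequence
# is Cauchy in the rationals, yet its limit (sqrt 2) is not rational.
for t in terms:
    print(t, float(t))
```

Exact rational arithmetic (`Fraction`) keeps every term in the field of rationals, which is the point of the construction: the real number √2 is identified with the equivalence class of all such sequences.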
The reals are uncountable; that is, while both the set of all natural numbers and the set of all real numbers are infinite sets, there can be no one-to-one function from the real numbers to the natural numbers: the cardinality of the set of all real numbers (denoted 𝔠 and called the cardinality of the continuum) is strictly greater than the cardinality of the set of all natural numbers (denoted ℵ0). The statement that there is no subset of the reals with cardinality strictly greater than ℵ0 and strictly smaller than 𝔠 is known as the continuum hypothesis (CH). It is known to be neither provable nor refutable using the axioms of Zermelo–Fraenkel set theory with the axiom of choice (ZFC), the standard foundation of modern mathematics, in the sense that some models of ZFC satisfy CH, while others violate it.
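The uncountability claim rests on Cantor's diagonal argument, which can be sketched computationally: given any list of decimal expansions, build a number whose n-th digit differs from the n-th digit of the n-th entry, so it cannot appear anywhere in the list. The function name and sample data below are illustrative assumptions, not from the article:

```python
def diagonal_counterexample(digit_rows):
    """Given an enumeration of reals in [0, 1), each as a list of decimal
    digits, build the digits of a number that differs from the n-th listed
    number in its n-th digit -- so it appears nowhere in the list."""
    new_digits = []
    for n, row in enumerate(digit_rows):
        d = row[n]
        # Pick a different digit, avoiding 0 and 9 so the new expansion
        # cannot collide with an equivalent expansion like 0.0999... = 0.1
        new_digits.append(5 if d != 5 else 4)
    return new_digits

listed = [
    [1, 4, 1, 4, 2],   # 0.14142...
    [3, 3, 3, 3, 3],   # 0.33333...
    [5, 0, 5, 0, 0],   # 0.50500...
    [9, 9, 9, 9, 9],   # 0.99999...
    [2, 7, 1, 8, 5],   # 0.27185...
]
diag = diagonal_counterexample(listed)
print(diag)  # → [5, 5, 4, 5, 4]
```

Since the same construction works for any proposed enumeration, no list indexed by the natural numbers can exhaust the reals.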
== History ==

Simple fractions were used by the Egyptians around 1000 BC; the Vedic ''Sulba Sutras'' ("The rules of chords") include what may be the first "use" of irrational numbers. The concept of irrationality was implicitly accepted by early Indian mathematicians from Manava onward, who were aware that the square roots of certain numbers, such as 2 and 61, could not be exactly determined.〔T. K. Puttaswamy, "The Accomplishments of Ancient Indian Mathematicians", pp. 410–1.〕 Around 500 BC, the Greek mathematicians led by Pythagoras realized the need for irrational numbers, in particular the irrationality of the square root of 2.
The Middle Ages brought the acceptance of zero, negative numbers, integers, and fractional numbers, first by Indian and Chinese mathematicians, and then by Arabic mathematicians, who were also the first to treat irrational numbers as algebraic objects, which was made possible by the development of algebra. Arabic mathematicians merged the concepts of "number" and "magnitude" into a more general idea of real numbers. The Egyptian mathematician Abū Kāmil Shujā ibn Aslam was the first to accept irrational numbers as solutions to quadratic equations or as coefficients in an equation, often in the form of square roots, cube roots and fourth roots.〔Jacques Sesiano, "Islamic mathematics", p. 148.〕
In the 16th century, Simon Stevin created the basis for modern decimal notation, and insisted that there is no difference between rational and irrational numbers in this regard.
In the 17th century, Descartes introduced the term "real" to describe roots of a polynomial, distinguishing them from "imaginary" ones.
In the 18th and 19th centuries, there was much work on irrational and transcendental numbers. Johann Heinrich Lambert (1761) gave the first flawed proof that π cannot be rational; Adrien-Marie Legendre (1794) completed the proof and showed that π is not the square root of a rational number. Paolo Ruffini (1799) and Niels Henrik Abel (1824) both constructed proofs of the Abel–Ruffini theorem: that the general quintic or higher equations cannot be solved by a general formula involving only arithmetical operations and roots.
Évariste Galois (1832) developed techniques for determining whether a given equation could be solved by radicals, which gave rise to the field of Galois theory. Joseph Liouville (1840) showed that neither ''e'' nor ''e''² can be a root of an integer quadratic equation, and then established the existence of transcendental numbers; Georg Cantor (1873) extended and greatly simplified this proof. Charles Hermite (1873) first proved that ''e'' is transcendental, and Ferdinand von Lindemann (1882) showed that π is transcendental. Lindemann's proof was much simplified by Weierstrass (1885), still further by David Hilbert (1893), and was finally made elementary by Adolf Hurwitz and Paul Gordan.
The development of calculus in the 18th century used the entire set of real numbers without having defined them cleanly. The first rigorous definition was given by Georg Cantor in 1871. In 1874, he showed that the set of all real numbers is uncountably infinite but the set of all algebraic numbers is countably infinite. Contrary to widely held beliefs, his first method was not his famous diagonal argument, which he published in 1891. See Cantor's first uncountability proof.

Excerpt source: the free encyclopedia Wikipedia
Read the full article on "Real number" at Wikipedia




Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.